3,397 research outputs found
Mitigating Branch-Shadowing Attacks on Intel SGX using Control Flow Randomization
Intel Software Guard Extensions (SGX) is a promising hardware-based
technology for protecting sensitive computations from potentially compromised
system software. However, recent research has shown that SGX is vulnerable to
branch-shadowing -- a side channel attack that leaks the fine-grained (branch
granularity) control flow of an enclave (SGX protected code), potentially
revealing sensitive data to the attacker. The previously-proposed defense
mechanism, called Zigzagger, attempted to hide the control flow, but has been
shown to be ineffective if the attacker can single-step through the enclave
using the recent SGX-Step framework.
Taking into account these stronger attacker capabilities, we propose a new
defense against branch-shadowing, based on control flow randomization. Our
scheme is inspired by Zigzagger, but provides quantifiable security guarantees
with respect to a tunable security parameter. Specifically, we eliminate
conditional branches and hide the targets of unconditional branches using a
combination of compile-time modifications and run-time code randomization.
We evaluated the performance of our approach by measuring the run-time
overhead of the ten SGX-Nbench benchmark programs in an SGX environment.
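The core branch-elimination idea can be illustrated with a constant-time select. The sketch below is not the paper's compile-time pass (which additionally randomizes code at run time); it only shows, in Python for readability, how a data-dependent conditional branch can be replaced by arithmetic masking:

```python
def ct_select(cond: bool, a: int, b: int) -> int:
    """Return a if cond else b, without a data-dependent branch.

    mask is all-ones (-1 in two's complement) when cond is true and
    all-zeros when false, so exactly one operand survives the AND.
    """
    mask = -int(cond)                  # True -> -1 (...1111), False -> 0
    return (a & mask) | (b & ~mask)   # no if/else on secret data
```

At the machine-code level the same pattern typically compiles to branch-free instructions such as cmov, which is what removes the branch-granularity signal that a branch-shadowing observer relies on.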
Space shuttle electromagnetic environment experiment. Phase A: Definition study
A program is discussed which develops a concept for measuring the electromagnetic environment on earth with equipment on board an orbiting space shuttle. Earlier work on spaceborne measuring experiments is reviewed, and emissions to be expected are estimated using, in part, previously gathered data. General relations among system parameters are presented, followed by a proposal on spatial and frequency scanning concepts. The methods proposed include a nadir-looking measurement with small lateral scan and a circularly scanned measurement looking tangent to the earth's surface at the horizon. Antenna requirements are given, assuming frequency coverage from 400 MHz to 40 GHz. For the low frequency range, 400-1000 MHz, a processed, thinned array is proposed which will be more fully analyzed in the next phase of the program. Preliminary hardware and data processing requirements are presented.
Effect of Testing and Treatment on Emergency Department Length of Stay Using a National Database
Objectives: Testing and treatment are essential aspects of the delivery of emergency care. Recognition of the effects of these activities on emergency department (ED) length of stay (LOS) has implications for administrators planning efficient operations, providers, and patients regarding expectations for length of visit; researchers in creating better models to predict LOS; and policy‐makers concerned about ED crowding. Methods: A secondary analysis was performed using years 2006 through 2008 of the National Hospital Ambulatory Medical Care Survey (NHAMCS), a nationwide study of ED services. In univariate and bivariate analyses, the authors assessed ED LOS and frequency of testing (blood test, urinalysis, electrocardiogram [ECG], radiograph, ultrasound, computed tomography [CT], or magnetic resonance imaging [MRI]) and treatment (providing a medication or performance of a procedure) according to disposition (discharged or admitted status). Two sets of multivariable models were developed to assess the contribution of testing and treatment to LOS, also stratified by disposition. The first was a series of logistic regression models to provide an overview of how testing and treatment activity affects three dichotomized LOS cutoffs at 2, 4, and 6 hours. The second was a generalized linear model (GLM) with a log‐link function and gamma distribution to fit skewed LOS data, which provided time costs associated with tests and treatment. Results: Among 360 million weighted ED visits included in this analysis, 227 million (63%) involved testing, 304 million (85%) involved treatment, and 201 million (56%) involved both. Overall, visits with any testing were associated with longer LOS (median = 196 minutes; interquartile range [IQR] = 125 to 305 minutes) than those with any treatment (median = 159 minutes; IQR = 91 to 262 minutes). This difference was more pronounced among discharged patients than admitted patients. 
Obtaining a test was associated with an adjusted odds ratio (OR) of 2.29 (95% confidence interval [CI] = 1.86 to 2.83) for experiencing a more than 4‐hour LOS, while performing a treatment had no effect (adjusted OR = 0.84; 95% CI = 0.68 to 1.03). The most time‐costly testing modalities included blood test (adjusted marginal effect on LOS = +72 minutes; 95% CI = 66 to 78 minutes), MRI (+64 minutes; 95% CI = 36 to 93 minutes), CT (+59 minutes; 95% CI = 54 to 65 minutes), and ultrasound (US; +56 minutes; 95% CI = 45 to 67 minutes). Treatment time costs were less substantial: performing a procedure (+24 minutes; 95% CI = 20 to 28 minutes) and providing a medication (+15 minutes; 95% CI = 8 to 21 minutes). Conclusions: Testing, and to a lesser extent treatment, was associated with prolonged LOS in the ED, particularly blood testing and advanced imaging. This knowledge may better direct efforts at streamlining delivery of care for the most time‐costly diagnostic modalities or suggest areas for future research into improving processes of care. Developing systems to improve efficient utilization of these services in the ED may improve patient and provider satisfaction. Such practice improvements could then be examined to determine their effects on ED crowding.
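The time costs above come from a log-link gamma GLM, in which each coefficient acts multiplicatively on expected LOS. A minimal sketch of that conversion, using a hypothetical coefficient and baseline rather than the study's fitted values:

```python
import math

def marginal_minutes(baseline_los_min: float, beta: float) -> float:
    """Under a log link, E[LOS] = exp(X @ b), so a covariate with
    coefficient beta scales expected LOS by exp(beta); its marginal
    effect at a given baseline is baseline * (exp(beta) - 1)."""
    return baseline_los_min * (math.exp(beta) - 1.0)

# Hypothetical: beta = 0.35 for "any blood test" at a 200-minute
# baseline LOS gives roughly 84 extra minutes.
extra = marginal_minutes(200.0, 0.35)
```

This is why marginal effects from a log-link model must be reported at a baseline: the same coefficient implies a larger minute cost for visits that are already long.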
Recurrent Indigestion in a Young Adult
Bochdalek hernias (BHs) arise from a congenital diaphragmatic defect and can result in gross displacement of abdominal tissues into the thorax. Although uncommon, they usually present as serious respiratory distress in infants. In the adult population they are typically asymptomatic and detected only incidentally. In this report, we present the case of a 26-year-old male who, after previous incorrect diagnoses, presented acutely with severe epigastric pain radiating to the back and deranged vital signs. A large left diaphragmatic hernia containing his pancreatic tail, spleen, stomach, and other intra-abdominal organs, together occupying a third of the hemithorax, was confirmed by CT scan. Although not common, BH should be considered among the differential diagnoses in patients presenting with an acute abdomen. A plain chest X-ray displaying a diminished left diaphragmatic outline or signs of mediastinal shift should raise suspicion. A previously normal chest X-ray can be deceptive and does not rule out a diaphragmatic hernia. Herein, we also review the literature on previously reported acute presentations in 11 similar adult cases and highlight the value of including BH among the differential diagnoses.
Surprise and recency in novelty detection in the primate brain
Primates and other animals must detect novel objects. However, the neuronal mechanisms of novelty detection remain unclear. Prominent theories propose that visual object novelty is either derived from the computation of recency (how long ago a stimulus was experienced) or is a form of sensory surprise (stimulus unpredictability). Here, we use high-channel electrophysiology in primates to show that in many primate prefrontal, temporal, and subcortical brain areas, object novelty detection is intertwined with the computations of recency and sensory surprise. Also, distinct circuits could be engaged by expected versus unexpected sensory surprise. Finally, we studied neuronal novelty-to-familiarity transformations during learning across many days. We found a diversity of timescales in neurons' learning rates and between-session forgetting rates, both within and across brain areas, that are well suited to support flexible behavior and learning in response to novelty. Our findings show that novelty sensitivity arises on multiple timescales across single neurons due to diverse but related computations of sensory surprise and recency, and shed light on the computational underpinnings of novelty detection in the primate brain.
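The recency-plus-forgetting account described above can be caricatured as exponential familiarity dynamics. The toy model below uses illustrative rates, not values fitted to the recordings; it only shows how a novelty signal falls with repeated presentations and partially recovers after between-session forgetting:

```python
import math

def novelty_trace(presentations, learn=0.5, session_gap_decay=0.3):
    """Toy model: familiarity f rises toward 1 with each presentation
    (rate `learn`) and decays across session boundaries (rate
    `session_gap_decay`); the novelty response is 1 - f.
    `presentations` is a list of booleans, True if a session boundary
    precedes that presentation. Rates are illustrative, not fitted."""
    f = 0.0
    out = []
    for new_session in presentations:
        if new_session:
            f *= math.exp(-session_gap_decay)  # between-session forgetting
        f += learn * (1.0 - f)                 # within-session learning
        out.append(1.0 - f)                    # novelty signal
    return out
```

Varying `learn` and `session_gap_decay` across units reproduces, qualitatively, the diversity of learning and forgetting timescales the abstract reports.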
CacheZoom: How SGX Amplifies The Power of Cache Attacks
In modern computing environments, hardware resources are commonly shared, and
parallel computation is widely used. Parallel tasks can cause privacy and
security problems if proper isolation is not enforced. Intel proposed SGX to
create a trusted execution environment within the processor. SGX relies on the
hardware, and claims runtime protection even if the OS and other software
components are malicious. However, SGX disregards side-channel attacks. We
introduce a powerful cache side-channel attack that provides system adversaries
a high resolution channel. Our attack tool named CacheZoom is able to virtually
track all memory accesses of SGX enclaves with high spatial and temporal
precision. As proof of concept, we demonstrate AES key recovery attacks on
commonly used implementations including those that were believed to be
resistant in previously studied scenarios. Our results show that SGX cannot
protect computations on sensitive data, and that efficient AES key recovery is
possible in a practical environment. In contrast to previous works, which
require hundreds of measurements, this is the first cache side-channel attack
on a real system that can recover AES keys with a minimal number of
measurements. We can successfully recover AES keys from T-Table based
implementations with as few as ten measurements.
Comment: Accepted at the Conference on Cryptographic Hardware and Embedded Systems (CHES '17).
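Why a handful of measurements can suffice at cache-line granularity can be seen in a noiseless simulation of first-round T-table leakage. This sketch is not CacheZoom itself (which runs Prime+Probe against real hardware); the line geometry (64-byte lines holding sixteen 4-byte T-table entries) is the standard one, and all names here are mine:

```python
import random

LINE_SHIFT = 4  # 64-byte line / 4-byte entry = 16 entries per line

def touched_lines(key, plaintext):
    # First-round T-table indices are p[i] ^ k[i]; a noiseless probe
    # reveals the set of cache lines those 16 lookups touched.
    return {(p ^ k) >> LINE_SHIFT for p, k in zip(plaintext, key)}

def recover_high_nibble(key, target=0, n_measurements=10, seed=1):
    # Intersect, across measurements, the key-nibble candidates
    # consistent with each observed line set.
    rng = random.Random(seed)
    candidates = set(range(16))
    for _ in range(n_measurements):
        pt = bytes(rng.randrange(256) for _ in range(16))
        lines = touched_lines(key, pt)
        candidates &= {line ^ (pt[target] >> 4) for line in lines}
    return candidates
```

The true high nibble survives every intersection by construction, while wrong candidates survive only by chance, so the set collapses after a few measurements; recovering the low nibble requires second-round analysis, as is usual for line-granularity attacks.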
Targeted alignment and end repair elimination increase alignment and methylation measure accuracy for reduced representation bisulfite sequencing data
Background DNA methylation is an important epigenetic modification involved in
many biological processes. Reduced representation bisulfite sequencing (RRBS)
is a cost-effective method for studying DNA methylation at single base
resolution. Although several tools are available for RRBS data processing and
analysis, it is not clear which strategy performs the best and there has not
been much attention to the contamination issue from artificial cytosines
incorporated during the end repair step of library preparation. To address
these issues, we describe a new method, Targeted Alignment and Artificial
Cytosine Elimination for RRBS (TRACE-RRBS), which aligns bisulfite sequence
reads to an in silico MspI-digested reference and specifically removes the
end-repair cytosines. We compared this approach on a simulated and a real
dataset with seven other RRBS analysis tools and with the Illumina 450K
microarray platform.
Results TRACE-RRBS aligns sequence reads to the small fraction of the genome
that the RRBS protocol targets, and was the fastest, most sensitive, and most
specific tool on the simulated dataset. On the real dataset, TRACE-RRBS took
about the same time as RRBSMAP and a third to a sixth of the time needed by
BISMARK and NOVOALIGN. TRACE-RRBS aligned more reads uniquely than the other
tools and achieved the highest correlation with the 450K microarray data.
Removing the artificial end-repair cytosines increased the correlation between
nearby CpGs and the accuracy of methylation quantification. Conclusions
TRACE-RRBS is a fast and accurate tool for RRBS data analysis. It is freely
available for academic use at http://bioinformaticstools.mayo.edu/
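The targeted-alignment step relies on an in silico MspI digest of the reference: MspI cuts at C^CGG, and RRBS size-selects the resulting fragments. A hedged sketch follows; the size-selection window is a common RRBS convention, not necessarily the tool's exact parameters:

```python
import re

def mspi_digest(seq, min_len=40, max_len=220):
    """Cut `seq` at every C^CGG (MspI) site and return the fragments
    within a size-selection window. Window bounds are illustrative."""
    # Zero-width lookahead so overlapping CCGG sites are all found;
    # MspI cuts after the first C, i.e. at site start + 1.
    cut_points = [m.start() + 1 for m in re.finditer(r"(?=CCGG)", seq)]
    bounds = [0] + cut_points + [len(seq)]
    frags = [seq[a:b] for a, b in zip(bounds, bounds[1:])]
    return [f for f in frags if min_len <= len(f) <= max_len]
```

Aligning only against these fragments, rather than the whole genome, is what lets a targeted aligner be both faster and more specific on RRBS reads.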
Generating entangled atom-photon pairs from Bose-Einstein condensates
We propose using spontaneous Raman scattering from an optically driven
Bose-Einstein condensate as a source of atom-photon pairs whose internal states
are maximally entangled. Generating entanglement between a particle which is
easily transmitted (the photon) and one which is easily trapped and coherently
manipulated (an ultracold atom) will prove useful for a variety of
quantum-information related applications. We analyze the type of entangled
states generated by spontaneous Raman scattering and construct a geometry which
results in maximum entanglement.
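The maximally entangled internal states referred to above have the generic Bell form; the labels below (two atomic ground states and two photon polarizations) are illustrative rather than the proposal's specific level scheme:

```latex
% Atomic ground states |g_+>, |g_-> correlated with photon
% polarizations sigma^+, sigma^- (labels illustrative)
|\Psi\rangle = \frac{1}{\sqrt{2}}\left(\,\lvert g_{+}\rangle\,\lvert\sigma^{+}\rangle
             + \lvert g_{-}\rangle\,\lvert\sigma^{-}\rangle\,\right)
```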
Practical Improvements of Profiled Side-Channel Attacks on a Hardware Crypto-Accelerator
This article investigates the relevance of the theoretical framework on profiled side-channel attacks presented by F.-X. Standaert et al. at Eurocrypt 2009. The analyses consist of a case study based on side-channel measurements acquired experimentally from a hardwired cryptographic accelerator. Therefore, with respect to previous formal analyses carried out on software measurements or on simulated data, the investigations we describe are more complex, due to the underlying chip's architecture and to the large amount of algorithmic noise. In this difficult context, we show however that with an engineer's mindset, two techniques can greatly improve both the off-line profiling and the on-line attack. First, we explore the appropriateness of different choices for the sensitive variables. We show that a skilled attacker aware of the register transfers occurring during the cryptographic operations can select the most adequate distinguisher, thus increasing the success rate. Second, we introduce a method based on the thresholding of leakage data to accelerate the profiling and matching stages. Indeed, drawing on an engineer's common sense, it is possible to visually foresee the shape of some eigenvectors and thereby push their estimates toward their asymptotic values by authoritatively zeroing the weak components, which contain mainly non-informational noise. This method empowers an attacker in that it saves traces when converging towards the correct value of the secret. Concretely, we demonstrate a fivefold speed-up in the on-line phase of the attack.
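The eigenvector-thresholding step admits a compact sketch: components below some fraction of the peak magnitude are zeroed as presumed noise. The threshold fraction here is illustrative; the article describes choosing components by visually anticipating the eigenvector's shape:

```python
def threshold_eigenvector(v, frac=0.25):
    """Zero components whose magnitude is below frac * max|v_i|,
    keeping only the strong, presumably informative peaks.
    `frac` is an illustrative threshold, not the article's value."""
    peak = max(abs(x) for x in v)
    cut = frac * peak
    return [x if abs(x) >= cut else 0.0 for x in v]
```

Forcing weak components to zero removes estimation noise that would otherwise only vanish asymptotically, which is why the thresholded templates converge on the secret with fewer traces.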